Chapter 2: On the Arithmetic Precision for Implementing Back-Propagation Networks on FPGA: A Case Study

Authors

  • Medhat Moussa
  • Shawki Areibi
  • Kristian Nichols
Abstract

Artificial Neural Networks (ANNs) are inherently parallel architectures and therefore a natural fit for custom implementation on FPGAs. One important implementation issue is determining the numerical precision format that gives the best tradeoff between precision and implementation area. Standard single- or double-precision floating-point representations minimize quantization errors but require significant hardware resources. A less precise fixed-point representation may require fewer hardware resources, but it adds quantization errors that may prevent learning from taking place, especially in regression problems. This chapter examines this issue and reports on a recent experiment in which we implemented a multilayer perceptron (MLP) on an FPGA using both fixed- and floating-point precision. Results show that the fixed-point MLP implementation was over 12x faster, over 13x smaller in area, and achieved a far greater processing density than the floating-point FPGA-based MLP.
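As a rough illustration of the precision tradeoff described above, the sketch below quantizes the weights and inputs of a single neuron's dot product to a signed 16-bit fixed-point format (an assumed format with 12 fractional bits, not the precision studied in the chapter) and compares the result against double-precision floating point; the difference is the quantization error that fixed-point hardware trades for smaller area.

# Minimal sketch, assuming a signed 16-bit fixed-point format with 12
# fractional bits; not the chapter's actual implementation.
import numpy as np

FRAC_BITS = 12          # assumed number of fractional bits
SCALE = 1 << FRAC_BITS  # 2**12

def to_fixed(x):
    """Quantize to a signed 16-bit fixed-point value (round to nearest)."""
    q = np.round(x * SCALE).astype(np.int32)
    return np.clip(q, -(1 << 15), (1 << 15) - 1)

def fixed_dot(w_q, x_q):
    """Fixed-point dot product: each product carries 2*FRAC_BITS fractional
    bits, so the accumulator is rescaled once at the end."""
    acc = np.sum(w_q.astype(np.int64) * x_q.astype(np.int64))
    return int(acc) >> FRAC_BITS          # back to FRAC_BITS fractional bits

rng = np.random.default_rng(0)
w = rng.uniform(-1.0, 1.0, 8)             # toy weights of one neuron
x = rng.uniform(-1.0, 1.0, 8)             # toy inputs

y_float = float(np.dot(w, x))
y_fixed = fixed_dot(to_fixed(w), to_fixed(x)) / SCALE
print(f"float: {y_float:.6f}  fixed: {y_fixed:.6f}  "
      f"quantization error: {abs(y_float - y_fixed):.2e}")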

Similar Articles

A Floating Point Multiplier based FPGA Synthesis for Neural Networks Enhancement

FPGA (Field Programmable Gate Array) implementation of Artificial Neural Networks (ANNs) calls for multipliers of various word lengths. In this paper, a new approach to designing a Floating-Point Multiplier (FPM) is developed and tested using VHDL (VHSIC Hardware Description Language). With a VHDL analyzer and logic-synthesis software, hardware prototypes can be implemented on an FPGA. ...
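For readers unfamiliar with what a floating-point multiplier computes, the following sketch mimics the single-precision datapath in software: XOR the sign bits, add the exponents and subtract one bias, multiply the 24-bit significands, and renormalize. It ignores zeros, subnormals, overflow, and IEEE rounding modes, and is only an illustration of the arithmetic, not the VHDL design described in the paper.

# Simplified single-precision multiply on the raw bit patterns
# (normalized, nonzero inputs assumed; truncation instead of IEEE rounding).
import struct

def bits(f):                      # float -> 32-bit pattern
    return struct.unpack(">I", struct.pack(">f", f))[0]

def from_bits(b):                 # 32-bit pattern -> float
    return struct.unpack(">f", struct.pack(">I", b))[0]

def fp32_mul(a, b):
    xa, xb = bits(a), bits(b)
    sign = (xa >> 31) ^ (xb >> 31)
    exp_a, exp_b = (xa >> 23) & 0xFF, (xb >> 23) & 0xFF
    man_a = (xa & 0x7FFFFF) | 0x800000    # restore hidden leading 1
    man_b = (xb & 0x7FFFFF) | 0x800000

    prod = man_a * man_b                  # up to 48-bit significand product
    exp = exp_a + exp_b - 127             # remove one bias
    if prod & (1 << 47):                  # product in [2, 4): shift right once more
        prod >>= 24
        exp += 1
    else:                                 # product in [1, 2)
        prod >>= 23
    return from_bits((sign << 31) | (exp << 23) | (prod & 0x7FFFFF))

print(fp32_mul(1.5, -2.25), 1.5 * -2.25)  # both print -3.375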


On the use of back propagation and radial basis function neural networks in surface roughness prediction

Various artificial neural network types are examined and compared for the prediction of surface roughness in manufacturing technology. The aim of the study is to evaluate different kinds of neural networks and observe their performance and applicability to the same problem. More specifically, feed-forward artificial neural networks are trained with three different back-propagation algorithms, ...


Geoid Determination Based on Log Sigmoid Function of Artificial Neural Networks (A Case Study: Iran)

A Back Propagation Artificial Neural Network (BPANN) is a well-known learning algorithm predicated on a gradient descent method that minimizes the square error between the network output and the target output values. In this study, 261 GPS/Leveling points and 8869 gravity intensity values of Iran were selected, and the geoid was then determined with three methods: "ellipsoidal Stokes integral", "BPANN", and "collocation" ...
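The excerpt refers to gradient descent on the squared error through a log-sigmoid unit; the toy sketch below spells out that update rule for a single neuron. The data, learning rate, and iteration count are illustrative assumptions, not values from the study.

# Minimal sketch: one log-sigmoid neuron trained by gradient descent
# on the squared error between its output and the target values.
import numpy as np

def logsig(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, (50, 3))              # assumed toy inputs
t = logsig(X @ np.array([0.8, -0.5, 0.3]))       # assumed targets
w = np.zeros(3)                                  # weights to learn
lr = 1.0                                         # assumed learning rate

for _ in range(10000):
    y = logsig(X @ w)
    err = y - t                                  # dE/dy for E = 0.5 * sum((y - t)**2)
    grad = X.T @ (err * y * (1.0 - y)) / len(X)  # chain rule through the log-sigmoid
    w -= lr * grad

print("learned weights:", np.round(w, 3))
print("mean squared error:", float(np.mean((logsig(X @ w) - t) ** 2)))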


Prediction of methanol loss by hydrocarbon gas phase in hydrate inhibition unit by back propagation neural networks

Gas hydrates often form in natural gas pipelines and process equipment at high pressure and low temperature. Methanol, as a hydrate inhibitor, is injected into systems with hydrate-formation potential, then recovered from the gas phase and re-injected into the system. Since methanol loss imposes an extra cost on gas processing plants, designing a process for its reduction is necessary. In this study, an accur...


Reconfigurable arithmetic for HPC

An often overlooked way to increase the efficiency of HPC on FPGAs is to tailor the arithmetic as tightly as possible to the application. An ideally efficient implementation would, for each of its operations, toggle and transmit just the number of bits required by the application at that point. Conventional microprocessors, with their word-level granularity and fixed memory hierarchy, keep us ...
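To make the word-length argument concrete, the short sketch below quantizes one test signal at several assumed fractional bit widths and reports the resulting error. On an FPGA, each bit removed also shrinks the multipliers, adders, and routing that carry it; only the accuracy side of that trade-off is shown here.

# Illustration only: the signal and the bit widths are arbitrary assumptions.
import numpy as np

x = np.sin(np.linspace(0, 2 * np.pi, 1000))   # assumed test signal

for frac_bits in (16, 12, 8, 4):
    scale = 1 << frac_bits
    x_q = np.round(x * scale) / scale          # quantize to `frac_bits` fractional bits
    print(f"{frac_bits:2d} fractional bits -> max abs error "
          f"{np.max(np.abs(x - x_q)):.2e}")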



Journal title:

Volume:   Issue:

Pages:   -

Publication date: 2017